High-probability complexity guarantees for nonconvex minimax problems

Neural Information Processing Systems

To this end, high-probability guarantees have been considered in the literature [35, 64, 20, 32, 22]. These results allow one to control the risk associated with worst-case tail events, as they specify how many iterations are sufficient to ensure G(x_k, y_k) is sufficiently small for any given failure probability q ∈ (0, 1).





Learning-Augmented Priority Queues

Neural Information Processing Systems

Their primary objective is to efficiently support the insertion of new elements with assigned priorities and the extraction of the highest priority element.
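The two core operations described above can be sketched with Python's standard `heapq` module (a min-heap), negating priorities so that extraction returns the highest-priority element. This is a minimal illustration of the classical data structure, not the learning-augmented variant studied in the paper; the class and method names are illustrative.

```python
import heapq

class PriorityQueue:
    """Minimal priority queue supporting the two core operations:
    insertion with an assigned priority, and extraction of the
    highest-priority element. (Illustrative sketch, not the paper's code.)"""

    def __init__(self):
        self._heap = []

    def insert(self, item, priority):
        # O(log n) insertion; priority is negated because heapq is a min-heap.
        heapq.heappush(self._heap, (-priority, item))

    def extract_max(self):
        # O(log n) removal of the element with the highest priority.
        neg_priority, item = heapq.heappop(self._heap)
        return item, -neg_priority

pq = PriorityQueue()
pq.insert("a", 1)
pq.insert("b", 5)
pq.insert("c", 3)
print(pq.extract_max())  # → ('b', 5)
```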



One-step differentiation of iterative algorithms

Neural Information Processing Systems

For iterative algorithms, implicit differentiation alleviates this issue but requires custom implementation of Jacobian evaluation. In this paper, we study one-step differentiation, also known as Jacobian-free backpropagation, a method as easy as automatic differentiation and as efficient as implicit differentiation for fast algorithms (e.g., superlinear
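The idea of one-step (Jacobian-free) differentiation can be sketched as follows: run the iterative solver to convergence without tracking derivatives, then differentiate only the final iteration. The example below, under assumed illustrative names, uses Newton's method for sqrt(a), a superlinearly convergent iteration, where the one-step derivative closely matches the exact one; it is a sketch of the general technique, not the paper's implementation.

```python
import math

def newton_sqrt(a, iters=20):
    # Forward pass: iterate x <- F(x, a) = 0.5 * (x + a / x) to convergence,
    # with no derivative tracking (treated as a black box).
    x = a
    for _ in range(iters):
        x = 0.5 * (x + a / x)
    return x

def one_step_grad(a):
    # One-step differentiation: differentiate only the last update
    # F(x, a) = 0.5 * (x + a / x) with respect to a, treating the incoming
    # iterate x as a constant: dF/da = 0.5 / x.
    x = newton_sqrt(a)  # converged solution x* ≈ sqrt(a)
    return 0.5 / x

a = 2.0
approx = one_step_grad(a)           # one-step estimate of d sqrt(a) / da
exact = 1.0 / (2.0 * math.sqrt(a))  # true derivative
print(approx, exact)
```

For fast (e.g., superlinearly convergent) solvers, the converged iterate is an accurate fixed point, so differentiating a single step recovers the derivative at the cost of one backward pass.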